Nonparametric Estimation of Renyi Divergence and Friends
Authors
Abstract
We consider nonparametric estimation of the L2, Rényi-α, and Tsallis-α divergences between continuous distributions. Our approach is to construct estimators for particular integral functionals of two densities and translate them into divergence estimators. For the integral functionals, our estimators are based on corrections of a preliminary plug-in estimator. We show that these estimators achieve the parametric convergence rate of n^{-1/2} when the smoothness s of each density is at least d/4, where d is the dimension. We also derive minimax lower bounds for this problem, which confirm that s > d/4 is necessary to achieve the n^{-1/2} rate of convergence. We validate our theoretical guarantees with a number of simulations.
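To fix notation (these are the standard definitions, not reproduced from the abstract), all three divergences are simple functions of integral functionals of the two densities:

\[
D_\alpha(p\|q) = \frac{1}{\alpha-1}\,\log \int p^\alpha q^{1-\alpha}, \qquad
T_\alpha(p\|q) = \frac{1}{\alpha-1}\Bigl(\int p^\alpha q^{1-\alpha} - 1\Bigr), \qquad
\|p-q\|_2^2 = \int p^2 - 2\int pq + \int q^2 .
\]

The quantities \int p^\alpha q^{1-\alpha}, \int p^2, and \int pq are exactly the integral functionals the abstract refers to. Below is a minimal KDE plug-in sketch for D_α in Python: it illustrates the pipeline (estimate the functional, then transform) but deliberately omits the correction terms the paper adds to the preliminary plug-in estimator, so it does not attain the n^{-1/2} rate.

import numpy as np
from scipy.stats import gaussian_kde

def renyi_alpha_plugin(x, y, alpha=0.8):
    # Naive KDE plug-in estimate of D_alpha(p||q) from x ~ p, y ~ q,
    # arrays of shape (n, d). A sketch only: no bias correction.
    n = len(x)
    x_fit, x_eval = x[: n // 2], x[n // 2 :]   # sample splitting
    p_hat = gaussian_kde(x_fit.T)              # density estimate for p
    q_hat = gaussian_kde(y.T)                  # density estimate for q
    # int p^a q^(1-a) = E_p[(q/p)^(1-a)], averaged over held-out points
    ratio = q_hat(x_eval.T) / p_hat(x_eval.T)
    t_hat = np.mean(ratio ** (1.0 - alpha))
    return np.log(t_hat) / (alpha - 1.0)

As a sanity check, for two unit-variance Gaussians with means a unit apart the closed-form value is D_α = α/2 (0.4 for α = 0.8), and the plug-in estimate should land near it, up to plug-in bias.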
Similar references
Generalized Exponential Concentration Inequality for Renyi Divergence Estimation
Estimating divergences consistently is of great importance in many machine learning tasks. Although this is a fundamental problem in nonparametric statistics, to the best of our knowledge no finite-sample exponential concentration bound has been derived for any divergence estimator. The main contribution of our work is to provide such a bound for an estimator of Rényi-α d...
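For orientation, a finite-sample exponential concentration bound of the advertised kind has the generic shape (the constants and the exact dependence on ε in the cited paper may differ; this is only a template):

\[
\Pr\Bigl(\bigl|\widehat D_\alpha - D_\alpha(p\|q)\bigr| > \varepsilon\Bigr) \le C_1 \exp\bigl(-C_2\, n\, \varepsilon^2\bigr),
\]

where C_1 and C_2 depend on α, the dimension, and smoothness or boundedness assumptions on p and q. The point is the exponential decay in the sample size n at fixed ε, which analyses of convergence in expectation alone do not provide.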
Information-theoretic bounds for exact recovery in weighted stochastic block models using the Renyi divergence
We derive sharp thresholds for exact recovery of communities in a weighted stochastic block model, where observations are collected in the form of a weighted adjacency matrix, and the weight of each edge is generated independently from a distribution determined by the community membership of its endpoints. Our main result, characterizing the precise boundary between success and failure of maxim...
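The sampling model described in this blurb is easy to make concrete. The sketch below generates a symmetric weighted adjacency matrix with edge weights drawn from a within-community or a between-community distribution; the function name and the choice of passing the distributions as callables are illustrative, not taken from the cited paper.

import numpy as np

def weighted_sbm(n, k, within, between, rng=None):
    # Toy weighted stochastic block model: n nodes split into k equal
    # communities (assumes k divides n). The weight of edge (i, j) is
    # drawn from `within` if i and j share a community, else from
    # `between`; both are callables rng -> float.
    rng = np.random.default_rng() if rng is None else rng
    labels = np.repeat(np.arange(k), n // k)
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            draw = within if labels[i] == labels[j] else between
            A[i, j] = A[j, i] = draw(rng)
    return A, labels

# e.g. A, z = weighted_sbm(200, 2, within=lambda r: r.normal(1.0, 1.0),
#                          between=lambda r: r.normal(0.0, 1.0))

In such a Gaussian example, recovery difficulty is governed by how distinguishable the two weight distributions are, which is the kind of separation the Rényi divergence in the paper's threshold quantifies.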
On the complexity of estimating Rényi divergences
This paper studies the complexity of estimating Rényi divergences of a distribution p, observed from samples, with respect to a baseline distribution q known a priori. Extending the results of Acharya et al. (SODA’15) on estimating Rényi entropy, we present improved estimation techniques together with upper and lower bounds on the sample complexity. We show that, in contrast to estimating Rényi e...
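To make the setting concrete: p lives on a finite alphabet and is seen only through samples, while q is known exactly. A naive plug-in baseline, with empirical frequencies standing in for p, is the kind of estimator such improved techniques compete against (this sketch is not the paper's own estimator):

import numpy as np

def renyi_divergence_known_q(samples, q, alpha=2.0):
    # Plug-in estimate of D_alpha(p||q) on the alphabet {0, ..., k-1},
    # with q a known probability vector (all entries assumed positive)
    # and alpha > 0, alpha != 1. Empirical frequencies replace p.
    q = np.asarray(q, dtype=float)
    counts = np.bincount(samples, minlength=len(q))
    p_hat = counts / counts.sum()
    mask = p_hat > 0                # zero-count symbols contribute nothing
    power_sum = np.sum(p_hat[mask] ** alpha * q[mask] ** (1.0 - alpha))
    return np.log(power_sum) / (alpha - 1.0)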
Robust Estimation in Linear Regression Model: the Density Power Divergence Approach
The minimum density power divergence method provides robust estimates when the dataset contains outliers. In this study, we introduce and use a robust minimum density power divergence estimator to estimate the parameters of the linear regression model, and then, through several numerical examples of the linear regression model, we demonstrate the robustness of this est...
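For reference, the density power divergence of Basu et al. (1998), on which this estimator is built, is usually written, for a tuning parameter β > 0, as

\[
d_\beta(g, f) = \int \Bigl\{ f^{1+\beta}(z) - \Bigl(1 + \tfrac{1}{\beta}\Bigr) g(z)\, f^{\beta}(z) + \tfrac{1}{\beta}\, g^{1+\beta}(z) \Bigr\}\, dz ,
\]

where g is the data-generating density and f the model density; as β → 0 it tends to the Kullback-Leibler divergence, and β trades efficiency for robustness. Since the g^{1+\beta} term does not involve the model, the minimum density power divergence estimator minimizes \int f_\theta^{1+\beta} - (1 + 1/\beta)\, n^{-1} \sum_i f_\theta^{\beta}(X_i) over θ.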
Alpha-Divergence for Classification, Indexing and Retrieval (Revised 2)
Motivated by Chernoff’s bound on the asymptotic probability of error, we propose the alpha-divergence measure and a surrogate, the alpha-Jensen difference, for feature classification, indexing and retrieval in image and other databases. The alpha-divergence, also known as the Rényi divergence, is a generalization of the Kullback-Leibler divergence and the Hellinger affinity between the probability densi...
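The claim that the alpha-divergence generalizes the Kullback-Leibler divergence can be made precise: with D_α as defined above, letting α → 1 and applying l'Hôpital's rule gives

\[
\lim_{\alpha \to 1} D_\alpha(p\|q) = \lim_{\alpha \to 1} \frac{\log \int p^\alpha q^{1-\alpha}}{\alpha - 1} = \int p \log \frac{p}{q} = \mathrm{KL}(p\|q),
\]

while α = 1/2 yields D_{1/2}(p\|q) = -2 \log \int \sqrt{pq}, a monotone function of the Hellinger affinity mentioned in the blurb.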